
[Misc] Split the LoRA code #30253

Merged
DarkLight1337 merged 4 commits into vllm-project:main from jeejeelee:improve-lora-code on Dec 8, 2025

Conversation

@jeejeelee (Collaborator) commented Dec 8, 2025

Purpose

For better readability, vllm/lora/models.py is split into two modules: lora_model.py and model_manager.py.
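A rough sketch of the resulting layout, as described in this PR (the per-file comments are my summary of the split, not verbatim from the diff):

```
vllm/lora/
├── lora_model.py      # LoRAModel and its loading helpers (moved out of models.py)
├── model_manager.py   # the manager side (models.py, renamed)
```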

Test Result


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@jeejeelee jeejeelee marked this pull request as draft December 8, 2025 10:25
Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request refactors the LoRA code by splitting the LoRAModel class into its own file, vllm/lora/lora_model.py, and renaming vllm/lora/models.py to vllm/lora/model_manager.py. The changes are mostly about moving code and updating imports.

I've found a few issues with incorrect imports in the test files that will cause ImportErrors, and a case of code duplication. Please see my detailed comments for suggestions.

 from vllm.config.load import LoadConfig
 from vllm.config.lora import LoRAConfig
-from vllm.lora.models import LoRAMapping
+from vllm.lora.lora_model import LoRAMapping
critical

LoRAMapping is not available in vllm.lora.lora_model, which will lead to an ImportError. It seems you intended to import it from vllm.lora.model_manager, which is the new name for vllm.lora.models where it was previously available.

Suggested change
-from vllm.lora.lora_model import LoRAMapping
+from vllm.lora.model_manager import LoRAMapping

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


     VocabParallelEmbeddingWithLoRA,
 )
-from vllm.lora.models import LoRALayerWeights, PackedLoRALayerWeights
+from vllm.lora.lora_model import LoRALayerWeights, PackedLoRALayerWeights

P1: Fix PackedLoRALayerWeights import after module split

tests/lora/test_layers.py now imports LoRALayerWeights and PackedLoRALayerWeights from vllm.lora.lora_model, but lora_model.py only imports LoRALayerWeights and never exposes PackedLoRALayerWeights. Importing this test (or any code using that path) will therefore raise ImportError before the suite runs; previously, vllm.lora.models re-exported both classes.


 from vllm.config.load import LoadConfig
 from vllm.config.lora import LoRAConfig
-from vllm.lora.models import LoRAMapping
+from vllm.lora.lora_model import LoRAMapping

P1: Correct LoRAMapping import target

tests/lora/test_worker.py now imports LoRAMapping from vllm.lora.lora_model, but that module no longer defines or re-exports LoRAMapping (it only contains the LoRAModel helpers), so the test module fails to import and the worker LoRA path cannot run. LoRAMapping is still provided by vllm.lora.model_manager and vllm.lora.layers and should be imported from there instead.


@jeejeelee jeejeelee marked this pull request as ready for review December 8, 2025 13:09
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@DarkLight1337 (Member)

/gemini review

@DarkLight1337 DarkLight1337 added the ready ONLY add when PR is ready to merge/full CI is needed label Dec 8, 2025
Contributor

@gemini-code-assist gemini-code-assist bot left a comment

Code Review

This pull request refactors the LoRA-related code by splitting the vllm/lora/models.py file into vllm/lora/lora_model.py and vllm/lora/model_manager.py for better readability and organization. The changes primarily involve moving the LoRAModel class to its own file and updating all relevant imports across the codebase. The refactoring is clean, but it has introduced a small issue of code duplication, which I've commented on.

Comment on lines +26 to +32
_GLOBAL_LORA_ID = 0


def get_lora_id():
global _GLOBAL_LORA_ID
_GLOBAL_LORA_ID += 1
return _GLOBAL_LORA_ID

high

The _GLOBAL_LORA_ID variable and get_lora_id function are duplicated in vllm/lora/model_manager.py. This duplication was likely introduced during refactoring. The function in model_manager.py is unused. To prevent potential bugs and confusion from two separate global counters, the duplicated code in model_manager.py should be removed, making this file the single source of truth for get_lora_id.
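The hazard of the duplicated counter can be shown in a few lines: two independent module-level counters each start at 0, so IDs that are supposed to be globally unique can collide. This is a minimal, self-contained sketch (the two modules are simulated with closures; only the name get_lora_id comes from the PR):

```python
# Each copy of the duplicated definition behaves like its own counter.
def make_counter():
    state = {"id": 0}

    def get_lora_id():
        state["id"] += 1
        return state["id"]

    return get_lora_id

# Simulate having the definition in both lora_model.py and model_manager.py:
get_id_from_lora_model = make_counter()
get_id_from_model_manager = make_counter()

ids = [
    get_id_from_lora_model(),     # 1
    get_id_from_lora_model(),     # 2
    get_id_from_model_manager(),  # 1 again -- collides with the first adapter
]
print(ids)  # [1, 2, 1]
```

Keeping a single source of truth (one module owning the counter, everyone else importing it) removes this failure mode.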

@mergify

mergify bot commented Dec 8, 2025

This pull request has merge conflicts that must be resolved before it can be
merged. Please rebase the PR, @jeejeelee.

https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/syncing-a-fork

Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@mergify mergify bot removed the needs-rebase label Dec 8, 2025
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@DarkLight1337 DarkLight1337 merged commit 67312ca into vllm-project:main Dec 8, 2025
49 checks passed
@jeejeelee jeejeelee deleted the improve-lora-code branch December 9, 2025 00:04
@Yatogaii

This PR might break veRL under certain circumstances.

ModuleNotFoundError: No module named 'vllm.lora.models'
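For downstream projects pinned to the old path, a small compatibility shim can tolerate both layouts. The module paths below are the real pre- and post-PR paths; the shim itself is a common pattern, not something shipped by this PR:

```python
# Try the new vLLM layout first (after PR #30253), then fall back to the
# old one; tolerate vLLM being absent entirely if the name is only needed
# for type annotations.
try:
    from vllm.lora.lora_model import LoRAModel  # vLLM after this PR
except ImportError:
    try:
        from vllm.lora.models import LoRAModel  # vLLM before this PR
    except ImportError:
        LoRAModel = None  # vLLM not installed; annotation-only usage
```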

HollowMan6 added a commit to HollowMan6/verl that referenced this pull request Dec 21, 2025
Related to vllm-project/vllm#30253 (comment)

Since `LoRAModel` import path was changed from `vllm.lora.models`
to `vllm.lora.lora_model`, let's just remove the use of `LoRAModel`
now as it's only for type annotation, and we still leave the comment
for annotating the type.

Signed-off-by: Hollow Man <hollowman@opensuse.org>
HollowMan6 added a commit to HollowMan6/verl that referenced this pull request Dec 21, 2025
Related to vllm-project/vllm#30253 (comment)

`LoRAModel` import path was changed from `vllm.lora.models`
to `vllm.lora.lora_model`.

Signed-off-by: Hollow Man <hollowman@opensuse.org>
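The fix described in this commit relies on LoRAModel being needed only as a type annotation. A common way to keep such an annotation without any runtime import is typing.TYPE_CHECKING; the sketch below is illustrative (register_adapter is a made-up helper, not verl's actual code), with the real new import path:

```python
from __future__ import annotations

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Resolved only by static type checkers; never executed at runtime, so
    # a moved or missing vllm module cannot raise ModuleNotFoundError here.
    from vllm.lora.lora_model import LoRAModel


def register_adapter(model: "LoRAModel") -> str:
    # Hypothetical helper: at runtime the argument is handled duck-typed.
    return type(model).__name__


print(register_adapter(object()))  # the annotation never triggers an import
```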
dsuhinin pushed a commit to dsuhinin/vllm that referenced this pull request Jan 21, 2026
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
Signed-off-by: dsuhinin <suhinin.dmitriy@gmail.com>

Labels

ready ONLY add when PR is ready to merge/full CI is needed

4 participants